Boosted Mixture Of Experts: An Ensemble Learning Scheme

Authors

  • Ran Avnimelech
  • Nathan Intrator
Abstract

We present a new supervised learning procedure for ensemble machines, in which outputs of predictors, trained on different distributions, are combined by a dynamic classifier combination model. This procedure may be viewed as either a version of mixture of experts (Jacobs, Jordan, Nowlan, & Hinton, 1991), applied to classification, or a variant of the boosting algorithm (Schapire, 1990). As a variant of the mixture of experts, it can be made appropriate for general classification and regression problems by initializing the partition of the data set to different experts in a boostlike manner. If viewed as a variant of the boosting algorithm, its main gain is the use of a dynamic combination model for the outputs of the networks. Results are demonstrated on a synthetic example and a digit recognition task from the NIST database and compared with classical ensemble approaches.
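As a rough illustration of the idea (a sketch, not the paper's algorithm), the snippet below combines two hypothetical experts, each accurate only on the region of a toy 1-D input space it was notionally trained on, through an input-dependent gate. The expert definitions, the Gaussian gating form, and all names are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D task: the positive class occupies [0.0, 0.5) and [1.0, 1.5).
X = rng.uniform(0.0, 2.0, size=200)
y = (((X >= 0.0) & (X < 0.5)) | ((X >= 1.0) & (X < 1.5))).astype(int)

# Two hypothetical "experts", each reliable only on the region it was
# (notionally) trained on -- mimicking a boost-like partition of the data.
def expert_a(x):
    return ((x >= 0.0) & (x < 0.5)).astype(float)   # knows the left region

def expert_b(x):
    return ((x >= 1.0) & (x < 1.5)).astype(float)   # knows the right region

# Dynamic combination model: an input-dependent gate that weights each
# expert by the input's proximity to that expert's region of competence.
def gate(x):
    w_a = np.exp(-(x - 0.25) ** 2)   # peaks at the centre of A's region
    w_b = np.exp(-(x - 1.25) ** 2)   # peaks at the centre of B's region
    z = w_a + w_b
    return w_a / z, w_b / z

def combined(x):
    w_a, w_b = gate(x)
    return w_a * expert_a(x) + w_b * expert_b(x)

pred = (combined(X) >= 0.5).astype(int)
accuracy = float((pred == y).mean())
```

On this toy problem the gated combination recovers the full labelling even though neither expert is correct over the whole domain, which is the point of combining experts trained on different distributions.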


Related articles

Comparison of Ensemble Approaches: Mixture of Experts and AdaBoost for a Regression Problem

Two machine learning approaches, mixture of experts and AdaBoost.R2, were adapted to the real-world regression problem of predicting the prices of residential premises from historical data of sales/purchase transactions. Computationally intensive experiments were conducted to compare empirically the prediction accuracy of ensemble models generated by the methods. The analysis of t...
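A single AdaBoost.R2 reweighting round (with the linear loss) can be sketched as follows; the function name and the toy numbers in the usage are illustrative assumptions, but the update rule is the standard AdaBoost.R2 step (Drucker, 1997).

```python
import numpy as np

def adaboost_r2_step(weights, residuals):
    """One AdaBoost.R2 reweighting round with the linear loss.

    weights   -- current example weights, summing to 1
    residuals -- prediction errors of this round's regressor
    """
    loss = np.abs(residuals) / np.abs(residuals).max()  # per-example loss in [0, 1]
    lbar = (weights * loss).sum()                       # weighted average loss
    beta = lbar / (1.0 - lbar)                          # < 1 when lbar < 0.5
    # Since beta < 1, a larger loss gives a smaller exponent and hence a
    # larger relative weight: hard examples are emphasized next round.
    new_w = weights * beta ** (1.0 - loss)
    return new_w / new_w.sum(), beta
```

For example, with uniform weights over three examples and residuals `[1.0, 0.2, 0.2]`, the hardest example (the first) receives the largest weight in the next round.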


High-Dimensional Unsupervised Active Learning Method

In this work, a hierarchical ensemble of projected clustering algorithms for high-dimensional data is proposed. The basic concept of the algorithm is based on the active learning method (ALM), a fuzzy learning scheme inspired by some behavioral features of human brain functionality. The high-dimensional unsupervised active learning method (HUALM) is a clustering algorithm which blurs the da...


Increasing Classification Performance by Aggregating the Effective Features of Different Neural Network Combination Methods

Both theoretical and experimental studies have shown that combining accurate Neural Networks (NNs) with negative error correlation in an ensemble greatly improves their generalization abilities. Negative Correlation Learning (NCL) and Mixture of Experts (ME), two popular combining methods, each employ different special error functions for the simultaneous training of NN experts to produce negat...
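The NCL error function mentioned above can be sketched for one ensemble member; the function name is an assumption, but the formula is the standard NCL penalty, e_i = (f_i - y)^2 + λ (f_i - f̄) Σ_{j≠i} (f_j - f̄).

```python
import numpy as np

def ncl_error(preds, y, i, lam=0.5):
    """NCL error for ensemble member i.

    preds -- array of predictions, one per ensemble member
    y     -- target value
    lam   -- penalty strength (lambda)
    """
    fbar = preds.mean()
    # Sum of (f_j - fbar) over all j != i, written without an explicit loop.
    others = preds.sum() - preds[i] - (len(preds) - 1) * fbar
    # Squared error plus the correlation penalty, which rewards member i
    # for erring in the opposite direction to the rest of the ensemble.
    return (preds[i] - y) ** 2 + lam * (preds[i] - fbar) * others
```

Note that since the deviations from the ensemble mean sum to zero, Σ_{j≠i} (f_j - f̄) = -(f_i - f̄), so the penalty equals -λ (f_i - f̄)^2: each member is pushed away from the ensemble mean, which is what produces the negative error correlation.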


Combining Classifiers and Learning Mixture-of-Experts

Expert combination is a classic strategy that has been widely used in various problem-solving tasks. A team of individuals with diverse and complementary skills tackles a task jointly such that a performance better than any single individual can achieve is attained by integrating their strengths. Starting from the late 1980s in the handwritten character recognition literature, studies ...


Solving Regression Problems Using Competitive Ensemble Models

The use of ensemble models in many problem domains has increased significantly in the last few years. Ensemble modeling, in particular boosting, has shown great promise in improving the predictive performance of a model. Combining the ensemble members is normally done in a cooperative fashion, where each of the ensemble members performs the same task and their predictions are aggregated to ...



Journal:
  • Neural computation

Volume 11, Issue 2

Pages: -

Publication year: 1999